
    Self-stabilizing algorithms for Connected Vertex Cover and Clique decomposition problems

    Full text link
    In many wireless networks there is neither a fixed physical backbone nor centralized network management. The nodes of such a network have to self-organize in order to maintain a virtual backbone used to route messages. Moreover, any node of the network can a priori be at the origin of a malicious attack. Thus, on the one hand the backbone must be fault-tolerant, and on the other hand it can be useful to monitor all network communications to identify an attack as soon as possible. We are interested in the minimum Connected Vertex Cover problem, a generalization of the classical minimum Vertex Cover problem, which allows one to obtain a connected backbone. Recently, Delbot et al. [DelbotLP13] proposed a new centralized algorithm with a constant approximation ratio of 2 for this problem. In this paper, we propose a distributed and self-stabilizing version of their algorithm with the same approximation guarantee. To the best of the authors' knowledge, it is the first distributed and fault-tolerant algorithm for this problem. The approach followed to solve the considered problem is based on the construction of a connected minimal clique partition. Therefore, we also design the first distributed self-stabilizing algorithm for this problem, which is of independent interest.
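    For illustration only, the Python sketch below builds a simple greedy clique partition of a toy graph and checks whether a given vertex set is a connected vertex cover. It is a centralized toy, not the paper's distributed, self-stabilizing algorithm; the graph encoding, helper names, and greedy rule are assumptions of this example, and the greedy partition is not guaranteed to be minimal in the sense used by the authors.

```python
# Illustrative, centralized sketch only: it shows the two objects the paper
# combines (a clique partition and a connected vertex cover), not the
# distributed, self-stabilizing algorithm of the paper.

def greedy_clique_partition(adj):
    """Greedily grow cliques until every vertex is covered.

    adj: dict mapping each vertex to the set of its neighbours (undirected graph).
    Returns a list of vertex sets, each inducing a clique.
    """
    unassigned = set(adj)
    cliques = []
    while unassigned:
        v = min(unassigned)                 # deterministic starting vertex
        clique = {v}
        # try to extend the clique with vertices adjacent to all current members
        for u in sorted(unassigned - {v}):
            if all(u in adj[w] for w in clique):
                clique.add(u)
        cliques.append(clique)
        unassigned -= clique
    return cliques

def is_connected_vertex_cover(adj, cover):
    """Check that `cover` touches every edge and induces a connected subgraph."""
    # every edge must have at least one endpoint in the cover
    for u, nbrs in adj.items():
        for w in nbrs:
            if u not in cover and w not in cover:
                return False
    # the subgraph induced by the cover must be connected (simple DFS)
    cover = set(cover)
    if not cover:
        return True
    stack, seen = [next(iter(cover))], set()
    while stack:
        x = stack.pop()
        if x in seen:
            continue
        seen.add(x)
        stack.extend((adj[x] & cover) - seen)
    return seen == cover

# tiny example graph: a path 0-1-2 attached to a triangle 2-3-4
adj = {0: {1}, 1: {0, 2}, 2: {1, 3, 4}, 3: {2, 4}, 4: {2, 3}}
print(greedy_clique_partition(adj))          # e.g. [{0, 1}, {2, 3, 4}]
print(is_connected_vertex_cover(adj, {1, 2, 3}))   # True
```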

    Birds and people in Europe

    Get PDF
    At a regional scale, species richness and human population size are frequently positively correlated across space. Such patterns may arise because both species richness and human density increase with energy availability. If the species-energy relationship is generated through the 'more individuals' hypothesis, then the prediction is that areas with high human densities will also support greater numbers of individuals from other taxa. We use the unique data available for the breeding birds in Europe to test this prediction. Overall regional densities of bird species are higher in areas with more people; species of conservation concern exhibit the same pattern. Avian density also increases faster with human density than does avian biomass, indicating that areas with a higher human density have a higher proportion of small-bodied individuals. The analyses also underline the low numbers of breeding birds in Europe relative to humans, with a median of just three individual birds per person, and 4 g of bird for every kilogram of human.
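    As a purely hypothetical worked example of the two summary ratios quoted above (birds per person and grams of bird per kilogram of human), the short Python sketch below computes them from invented regional totals and assumed average body masses; none of the numbers come from the study.

```python
# Hypothetical illustration only: toy calculation of the two summary ratios
# mentioned in the abstract. Regional totals and body masses are invented.
import numpy as np

people = np.array([2.1e6, 0.8e6, 5.4e6])    # humans per region (invented)
birds = np.array([6.5e6, 2.3e6, 16.0e6])    # breeding birds per region (invented)
mean_bird_mass_g = 90.0                     # assumed average bird body mass
mean_human_mass_kg = 70.0                   # assumed average human body mass

birds_per_person = birds / people
grams_bird_per_kg_human = (birds * mean_bird_mass_g) / (people * mean_human_mass_kg)

print("median birds per person:", np.median(birds_per_person))
print("median g of bird per kg of human:", np.median(grams_bird_per_kg_human))
```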

    Evidence for bone grease rendering during the Upper Paleolithic at Vale Boi (Algarve, Portugal)

    Get PDF

    Fast Structuring of Radio Networks for Multi-Message Communications

    Full text link
    We introduce collision free layerings as a powerful way to structure radio networks. These layerings can replace hard-to-compute BFS-trees in many contexts while having an efficient randomized distributed construction. We demonstrate their versatility by using them to provide near optimal distributed algorithms for several multi-message communication primitives. Designing efficient communication primitives for radio networks has a rich history that began 25 years ago when Bar-Yehuda et al. introduced fast randomized algorithms for broadcasting and for constructing BFS-trees. Their BFS-tree construction time was $O(D \log^2 n)$ rounds, where $D$ is the network diameter and $n$ is the number of nodes. Since then, the complexity of a broadcast has been resolved to be $T_{BC} = \Theta(D \log \frac{n}{D} + \log^2 n)$ rounds. On the other hand, BFS-trees have been used as a crucial building block for many communication primitives and their construction time remained a bottleneck for these primitives. We introduce collision free layerings that can be used in place of BFS-trees and we give a randomized construction of these layerings that runs in nearly broadcast time, that is, w.h.p. in $T_{Lay} = O(D \log \frac{n}{D} + \log^{2+\epsilon} n)$ rounds for any constant $\epsilon > 0$. We then use these layerings to obtain: (1) a randomized algorithm for gathering $k$ messages running w.h.p. in $O(T_{Lay} + k)$ rounds; (2) a randomized $k$-message broadcast algorithm running w.h.p. in $O(T_{Lay} + k \log n)$ rounds. These algorithms are optimal up to the small difference in the additive poly-logarithmic term between $T_{BC}$ and $T_{Lay}$. Moreover, they imply the first optimal $O(n \log n)$ round randomized gossip algorithm.
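    As a point of reference for the structure these layerings replace, the Python sketch below computes ordinary BFS layers (distance classes from a source) in a toy graph. It is not the paper's randomized, collision free layering construction; the graph encoding and function name are assumptions of this example.

```python
# Reference sketch: plain BFS layers from a source node. The paper's collision
# free layerings are a different, randomized distributed structure; this only
# shows the kind of distance-class decomposition they stand in for.
from collections import deque

def bfs_layers(adj, source):
    """Return a list of layers: layer i holds the nodes at distance i from source."""
    dist = {source: 0}
    layers = [[source]]
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                if dist[v] == len(layers):
                    layers.append([])
                layers[dist[v]].append(v)
                queue.append(v)
    return layers

# toy radio network as an undirected adjacency dict
adj = {0: {1, 2}, 1: {0, 3}, 2: {0, 3}, 3: {1, 2, 4}, 4: {3}}
print(bfs_layers(adj, 0))   # [[0], [1, 2], [3], [4]]
```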

    BMI and All-Cause Mortality in a Population-Based Cohort in Rural South Africa

    Get PDF
    OBJECTIVE: This study evaluates the association between BMI and all-cause and cause-specific mortality in South Africa. METHODS: Prospective, population-based observational cohort data from rural South Africa were analyzed. BMI was measured in 2010. Demographic characteristics were recorded and deaths were verified with verbal autopsy interviews. The InterVA-5 tool was used to assign causes of death. HIV testing was conducted annually. Cox proportional hazards models were fit to estimate the effect of BMI on all-cause and cause-specific mortality, accounting for the competing risk of death from other causes. Models were adjusted for sociodemographic characteristics and HIV status, and inverse probability weighting for survey nonparticipation was used. RESULTS: The cohort consisted of 9,728 individuals. In adjusted models, those with a BMI of 25.0 to 29.9 kg/m² or 30.0 to 34.9 kg/m² had a lower hazard of death (adjusted hazard ratio: 0.80; 95% CI: 0.69-0.92 and adjusted hazard ratio: 0.75; 95% CI: 0.60-0.93, respectively) compared with those with a BMI of 18.5 to 24.9 kg/m². CONCLUSIONS: Individuals in South Africa who meet clinically defined criteria for overweight or obesity had a lower risk of all-cause mortality than those with a normal BMI. These findings were stronger for women and for communicable conditions.
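    A minimal sketch of the kind of model described, fitted to synthetic data with the lifelines library, is shown below. The column names, BMI category coding, and synthetic-data generation are assumptions of this example, not the study's analysis code (which additionally handled competing risks and inverse probability weighting).

```python
# Minimal sketch: Cox proportional hazards model with BMI categories, fitted to
# synthetic data. Not the study's code; all variables and effects are invented.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 2000

# synthetic cohort: BMI category (reference = 18.5-24.9), age, sex
bmi_cat = rng.choice(["18.5-24.9", "25.0-29.9", "30.0-34.9"], size=n)
age = rng.uniform(20, 80, size=n)
female = rng.integers(0, 2, size=n)

# synthetic survival times with a mildly lower hazard for the higher BMI groups
base_hazard = 0.02 * np.exp(0.02 * (age - 50))
hr = np.where(bmi_cat == "25.0-29.9", 0.8,
              np.where(bmi_cat == "30.0-34.9", 0.75, 1.0))
time = rng.exponential(1.0 / (base_hazard * hr))
event = (time < 10).astype(int)          # administrative censoring at 10 years
time = np.minimum(time, 10)

df = pd.DataFrame({"time": time, "event": event, "age": age,
                   "female": female, "bmi_cat": bmi_cat})
# dummy-code BMI with 18.5-24.9 as the reference category
df = pd.get_dummies(df, columns=["bmi_cat"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()   # exp(coef) columns give the adjusted hazard ratios
```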

    Parental stress before, during, and after pediatric stem cell transplantation: a review article

    Get PDF
    Goals of work: Pediatric stem cell transplantation (SCT) is a stressful treatment for children with relapsed or high-risk malignancies, immune deficiencies, and certain blood diseases. Parents of children undergoing SCT can experience ongoing stress related to the SCT period. The aim of this article was to present a literature review of articles on parental distress and adaptation before, during, and after SCT and to identify risk and protective factors. Materials and methods: The review was conducted systematically using the PubMed, Web of Science, PsycINFO, and Picarta databases. Eighteen articles met our inclusion criteria: publication date between January 1, 1990 and January 1, 2009; studies concerning parents of children undergoing SCT; studies examining the psychological adjustment and/or stress reactions of parents as primary outcomes; and studies available in English. Main results: The highest levels of parental stress are reported in the period preceding SCT and during the acute phase. Stress levels decrease steadily after discharge in most parents. However, in a subgroup of parents, stress levels remain elevated post-SCT. Parents most at risk in the longer term display the highest levels of stress during the acute phase of SCT. Conclusions: Psychosocial assessment before SCT, during the acute phase, and in the longer term is necessary to identify parents in need of support and follow-up care.

    Energy policies avoiding a tipping point in the climate system

    Get PDF
    Paleoclimate evidence and climate models indicate that certain elements of the climate system may exhibit thresholds, with small changes in greenhouse gas emissions resulting in non-linear and potentially irreversible regime shifts with serious consequences for socio-economic systems. Such thresholds or tipping points in the climate system are likely to depend on both the magnitude and the rate of change of surface warming. The collapse of the Atlantic thermohaline circulation (THC) is one example of such a threshold. To evaluate mitigation policies that curb greenhouse gas emissions to levels that prevent such a climate threshold from being reached, we use the MERGE model of Manne, Mendelsohn, and Richels. Depending on assumptions about climate sensitivity and technological progress, our analysis shows that preserving the THC may require a fast and strong reduction of greenhouse gas emissions from today's level, with a transition to nuclear and/or renewable energy, possibly combined with the use of carbon capture and sequestration systems.
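    As a schematic illustration of the point that tipping points may depend on both the magnitude and the rate of warming (and not a stand-in for the MERGE analysis), the toy Python check below flags a warming trajectory that exceeds an assumed threshold on either criterion; the threshold values and trajectories are invented.

```python
# Schematic toy only: flags whether a warming trajectory crosses an assumed
# tipping-point criterion on magnitude or on rate of change. The thresholds
# and trajectories are invented; this is not the MERGE model.
import numpy as np

def crosses_threshold(years, warming, max_warming=3.0, max_rate=0.03):
    """True if warming (degC above pre-industrial) exceeds max_warming, or its
    rate of change exceeds max_rate (degC per year), anywhere along the path."""
    rate = np.gradient(warming, years)
    return bool(np.any(warming > max_warming) or np.any(rate > max_rate))

years = np.arange(2000, 2101, 10)
business_as_usual = 0.8 + 0.035 * (years - 2000)              # invented path
mitigation = 0.8 + 1.6 * (1 - np.exp(-(years - 2000) / 60.0))  # invented path

print(crosses_threshold(years, business_as_usual))   # True
print(crosses_threshold(years, mitigation))          # False
```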